Structural information in two-dimensional patterns: Entropy convergence and excess entropy
Authors
David P. Feldman, James P. Crutchfield
Abstract
We develop information-theoretic measures of spatial structure and pattern in more than one dimension. As is well known, the entropy density of a two-dimensional configuration can be efficiently and accurately estimated via a converging sequence of conditional entropies. We show that the manner in which these conditional entropies converge to their asymptotic value serves as a measure of global...
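The estimation idea in the abstract (per-site conditional entropies computed over growing neighborhoods, converging toward the entropy density) can be illustrated numerically. The sketch below is only a rough proxy under assumptions of ours: it uses differences of square-block entropies in place of the paper's neighborhood-template conditional entropies, and the function names and toy configuration are invented for the example.

import numpy as np
from collections import Counter

def block_entropy(config, L):
    # Empirical Shannon entropy (in bits) of all L-by-L blocks of a 2D array.
    rows, cols = config.shape
    counts = Counter()
    for i in range(rows - L + 1):
        for j in range(cols - L + 1):
            counts[config[i:i + L, j:j + L].tobytes()] += 1
    probs = np.array(list(counts.values()), dtype=float)
    probs /= probs.sum()
    return float(-(probs * np.log2(probs)).sum())

def entropy_density_estimates(config, L_max):
    # h(L): per-site entropy-density estimates from differences of successive
    # block entropies H(L) - H(L-1), normalised by the number of added sites.
    H = [0.0] + [block_entropy(config, L) for L in range(1, L_max + 1)]
    return [(H[L] - H[L - 1]) / (2 * L - 1) for L in range(1, L_max + 1)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy configuration: biased i.i.d. binary spins with no spatial structure,
    # so the h(L) estimates should converge quickly to the entropy density.
    config = (rng.random((200, 200)) < 0.3).astype(np.uint8)
    h = entropy_density_estimates(config, L_max=3)
    excess = sum(hL - h[-1] for hL in h)
    print("h(L):", [round(x, 4) for x in h])
    print("accumulated excess above the L_max estimate:", round(excess, 4))

For a configuration with genuine spatial structure, the h(L) values would approach their asymptotic value more slowly, and the accumulated excess above that value would grow correspondingly, which is the sense in which the convergence behavior carries structural information.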
Similar resources
Entropy, negentropy, and information
The concept of information, over the course of its development, has been connected to the concept of entropy created by nineteenth-century thermodynamics scholars. From this viewpoint, information means order, or negentropy. Entropy, on the other hand, is connected to concepts such as chaos and noise, which in turn cause disorder. In the present paper, ...
Dimensional behaviour of entropy and information
We develop an information-theoretic perspective on some questions in convex geometry, providing for instance a new equipartition property for log-concave probability measures, some Gaussian comparison results for log-concave measures, an entropic formulation of the hyperplane conjecture, and a new reverse entropy power inequality for log-concave measures analogous to V. Milman’s reverse Brunn-M...
Holographic entropy bound in two-dimensional gravity
Bousso’s entropy bound for two-dimensional gravity is investigated in the lightcone gauge. It is shown that due to the Weyl anomaly, the null component of the energy-momentum tensor takes a nonvanishing value, and thus, combined with the conditions that were recently proposed by Bousso, Flanagan and Marolf, a holographic entropy bound similar to Bousso’s is expected to hold in two dimensions. A...
Ergodic decomposition of excess entropy and conditional mutual information
The article discusses excess entropy, defined as the mutual information between the past and future of a stationary process. The central result is an ergodic decomposition: excess entropy is the sum of the self-information of the shift-invariant σ-field and the average of the excess entropies of the ergodic components of the process. The result is derived using generalized conditional mutual information for fi...
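Transcribing that statement symbolically, with notation that is our assumption rather than the article's (a stationary process (X_t), shift-invariant σ-field \mathcal{I}, excess entropy E):

\[
  E \;=\; I\big(X_{\le 0};\, X_{> 0}\big)
    \;=\; H(\mathcal{I}) \;+\; \mathbb{E}\!\left[ E_{\mathrm{erg}} \right],
\]

where H(\mathcal{I}) is the self-information of the shift-invariant σ-field and E_{\mathrm{erg}} denotes the excess entropy of an ergodic component drawn from the decomposition.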
Journal
Journal title: Physical Review E
Year: 2003
ISSN: 1063-651X, 1095-3787
DOI: 10.1103/physreve.67.051104